Note on a method of conjugate subgradients for minimizing nondifferentiable functions

Author

  • Philip Wolfe
Abstract

An algorithm is described for finding the minimum of any convex, not necessarily differentiable, function f of several variables. The algorithm yields a sequence of points tending to the solution of the problem, if any, requiring only the calculation of f and one subgradient of f at designated points. Its rate of convergence is estimated for convex and for differentiable convex functions. For the latter, it is an extension of the method of conjugate gradients and terminates for quadratic functions.
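
As a rough illustration of the oracle model the abstract describes — each query returns f and a single subgradient of f — the following sketch runs a plain subgradient iteration on an assumed piecewise-linear convex test function. The data A, b and the diminishing step rule are illustrative; this is a minimal sketch of the oracle interface, not Wolfe's conjugate subgradient algorithm itself.

```python
import numpy as np

# Illustrative problem (assumption, not from the paper):
# f(x) = max_i (a_i . x + b_i), convex and nondifferentiable along the
# kinks where the maximizing affine piece changes.
A = np.array([[1.0, 2.0], [-1.0, 0.5], [0.0, -1.0]])
b = np.array([0.0, 1.0, 0.5])

def f(x):
    return np.max(A @ x + b)

def one_subgradient(x):
    # The gradient of any maximizing affine piece is a valid subgradient.
    return A[np.argmax(A @ x + b)]

# Plain subgradient iteration with a diminishing step: it uses exactly
# the information the abstract requires, f and one subgradient per point.
x = np.zeros(2)
for k in range(1, 301):
    x = x - (1.0 / k) * one_subgradient(x)
print(f(x), x)
```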


Similar articles

Note on an extension of "Davidon" methods to nondifferentiable functions

This note summarizes a paper [4] to appear in full elsewhere. It presents an algorithm for the minimization of a general (not necessarily differentiable) convex function. Its central idea is the construction of descent directions as projections of the origin onto the convex hull of previously calculated subgradients as long as satisfactory progress can be made. Using projection to obtain a d...
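
The direction-finding step named in this stub — projecting the origin onto the convex hull of the subgradients gathered so far — amounts to a small quadratic program. A minimal sketch, assuming SciPy's SLSQP solver as one convenient (not the paper's) way to solve it:

```python
import numpy as np
from scipy.optimize import minimize

def shortest_vector_in_hull(G):
    """Project the origin onto conv{rows of G}: minimize ||lam @ G||
    over lam >= 0 with sum(lam) = 1.  The negation of the result is
    the tentative descent direction in this kind of scheme."""
    m = len(G)
    res = minimize(lambda lam: 0.5 * np.dot(lam @ G, lam @ G),
                   np.full(m, 1.0 / m),
                   bounds=[(0.0, 1.0)] * m,
                   constraints=[{"type": "eq",
                                 "fun": lambda lam: lam.sum() - 1.0}],
                   method="SLSQP")
    return res.x @ G

# Two subgradients lying in one half-space: the hull misses the origin,
# so the shortest hull vector d is nonzero and -d is a descent direction.
G = np.array([[1.0, 0.0], [0.6, 0.8]])
d = shortest_vector_in_hull(G)
print(d, np.linalg.norm(d))
# A (near-)zero d would certify that 0 lies in the hull, i.e. that the
# current point is (approximately) optimal.
```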


Use of Differentiable and Nondifferentiable Optimization Algorithms for Variational Data Assimilation with Discontinuous Cost Functions

Cost functions formulated in four-dimensional variational data assimilation (4DVAR) are nonsmooth in the presence of discontinuous physical processes (i.e., the presence of "on–off" switches in NWP models). The adjoint model integration produces values of subgradients, instead of gradients, of these cost functions with respect to the model's control variables at discontinuous points. Minimiza...


Incremental Subgradient Methods for Nondifferentiable Optimization

We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to perform the subgradient iteration incrementally, by sequentially taking steps along the subgradien...
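
A minimal sketch of the incremental idea this stub describes, assuming an illustrative objective f(x) = sum_i |a_i . x - b_i| and a diminishing step rule: each inner step follows a subgradient of a single component rather than of the full sum.

```python
import numpy as np

# Illustrative objective (assumption, not from the paper):
# f(x) = sum_i |a_i . x - b_i|, a sum of many simple convex components.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true          # noiseless, so f is minimized exactly at x_true

x = np.zeros(3)
for k in range(1, 101):
    step = 0.01 / k     # diminishing step size, shared by the whole pass
    for a_i, b_i in zip(A, b):
        # One incremental step per component, along a subgradient of
        # the single term |a_i . x - b_i| rather than of the full sum.
        g_i = np.sign(a_i @ x - b_i) * a_i
        x = x - step * g_i
print(x)                # approaches x_true as the steps diminish
```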



Journal:
  • Math. Program.

Volume 7, Issue -

Pages -

Publication date: 1974